Doubly-Affine Extractors, and Their Applications
In this work we challenge the common misconception that information-theoretic (IT) privacy is too impractical to be used in the real world: we propose to build simple and reusable IT-encryption solutions whose only efficiency penalty (compared to computationally-secure schemes) comes from a large secret key size, which is often a rather minor inconvenience, as storage is cheap. In particular, our solutions are stateless and locally computable at the optimal rate, meaning that honest parties do not maintain state and read only (optimally) small portions of their large keys with every use.
Moreover, we also propose a novel architecture for outsourcing the storage of these long keys to a network of semi-trusted servers, trading the need to store large secrets for the assumption that it is hard to simultaneously compromise too many publicly accessible ad-hoc servers. Our architecture supports everlasting privacy and post-application security of the derived one-time keys, resolving two major limitations of a related model for outsourcing key storage, called the bounded storage model.
Both of these results come from nearly optimal constructions of so-called doubly-affine extractors: locally-computable, seeded extractors Ext(X,S) which are linear functions of X (for any fixed seed S), and protect against bounded affine leakage on X. This holds unconditionally, even if (a) affine leakage may adaptively depend on the extracted key R = Ext(X,S); and (b) the seed S is only computationally secure. Neither of these properties is possible with general-leakage extractors.
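To make the "linear in X for any fixed seed" and "locally computable" shapes concrete, here is a minimal Python sketch of a seeded extractor with both properties. The subset-parity construction, the parameter names, and the use of SHA-256 to expand the seed are illustrative assumptions of mine, not the paper's actual doubly-affine construction.

```python
import hashlib
import secrets

def local_linear_extract(x_bits, seed, out_len=16, reads_per_bit=8):
    """Toy locally-computable seeded extractor, linear in x for any fixed seed.

    Each output bit is the XOR (parity) of a few seed-selected positions of
    the long key x, so only out_len * reads_per_bit bits of x are ever read.
    """
    n = len(x_bits)
    out = []
    for j in range(out_len):
        # Derive the positions to read for output bit j from the seed alone.
        digest = hashlib.sha256(seed + j.to_bytes(4, "big")).digest()
        positions = [int.from_bytes(digest[2 * i:2 * i + 2], "big") % n
                     for i in range(reads_per_bit)]
        out.append(sum(x_bits[p] for p in positions) % 2)  # parity = GF(2)-linear
    return out

# usage: a long stored key and a short public seed
x = [secrets.randbelow(2) for _ in range(1 << 12)]
s = secrets.token_bytes(16)
r = local_linear_extract(x, s)
```

Because each output bit is a parity of key bits, Ext(x1 XOR x2, s) = Ext(x1, s) XOR Ext(x2, s) for any fixed seed, which is exactly the linearity the abstract refers to.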
Beating Shannon requires BOTH efficient adversaries AND non-zero advantage
In this note we formally prove a folklore (but, to the best of our knowledge, undocumented) fact: in order to beat the famous Shannon lower bound on key length for one-time-secure encryption, one must *simultaneously* restrict the attacker to be efficient and allow the attacker to break the system with some non-zero (if negligible) probability. Despite the result's folklore status, we were unable to find a clean and simple proof of it, even after asking several experts in the field. We hope that cryptography instructors will find this note useful when justifying the transition from information-theoretic to computational cryptography.
We note that our proof cleanly handles *probabilistic* encryption, as well as a small *decryption error*.
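The Shannon bound itself is easy to check exhaustively for toy parameters. The sketch below (my own illustration, not the note's proof) verifies perfect secrecy of the one-time pad and shows that shortening the key below the message length already breaks perfect secrecy:

```python
def otp_enc(key, msg):
    # one-time pad over bit strings represented as integers
    return key ^ msg

def is_perfectly_secret(enc, key_bits, msg_bits):
    """Check Shannon perfect secrecy for a uniform key: the distribution of
    ciphertexts must be identical for every message."""
    keys = range(1 << key_bits)
    msgs = range(1 << msg_bits)
    dists = []
    for m in msgs:
        counts = {}
        for k in keys:
            c = enc(k, m)
            counts[c] = counts.get(c, 0) + 1
        dists.append(counts)
    return all(d == dists[0] for d in dists)

# One-time pad with |key| = |message|: perfectly secret.
assert is_perfectly_secret(otp_enc, 3, 3)

# The same pad with a strictly shorter key fails: for a 2-bit key the
# ciphertext reveals the high bit of a 3-bit message.
assert not is_perfectly_secret(lambda k, m: k ^ m, 2, 3)
```

Against an unbounded adversary allowed zero advantage, no trick recovers the short-key case, which is exactly the point of the note.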
Multiparty Quantum Coin Flipping
We investigate coin-flipping protocols for multiple parties in a quantum
broadcast setting:
(1) We propose and motivate a definition for quantum broadcast. Our model of
quantum broadcast channel is new.
(2) We show that quantum broadcast is essentially a combination of
pairwise quantum channels and a classical broadcast channel. This conclusion
is somewhat surprising, but it helps us in both our lower and upper bounds.
(3) We provide tight upper and lower bounds on the optimal bias epsilon of a
coin which can be flipped by k parties of which exactly g parties are honest:
for any 1 <= g <= k, epsilon = 1/2 - Theta(g/k).
Thus, as long as a constant fraction of the players are honest, they can
prevent the coin from being fixed with at least a constant probability. This
result stands in sharp contrast with the classical setting, where no
non-trivial coin-flipping is possible when g <= k/2.
Comment: v2: bounds now tight via new protocol; to appear at IEEE Conference
on Computational Complexity 200
Small-Box Cryptography
One of the ultimate goals of symmetric-key cryptography is to find a rigorous theoretical framework for building block ciphers from small components, such as cryptographic S-boxes, and then argue why iterating such small components for sufficiently many rounds would yield a secure construction. Unfortunately, a fundamental obstacle towards reaching this goal comes from the fact that traditional security proofs cannot get security beyond 2^{-n}, where n is the size of the corresponding component.
As a result, prior provably secure approaches - which we call "big-box cryptography" - always made n larger than the security parameter, which led to several problems: (a) the design was too coarse to really explain practical constructions, as (arguably) the most interesting design choices, which happen when instantiating such "big-boxes", were completely abstracted out; (b) the theoretically predicted number of rounds for the security of this approach was always dramatically smaller than in reality, where the "big-box" building block could not be made as ideal as required by the proof. For example, Even-Mansour (and, more generally, key-alternating) ciphers completely ignored the substitution-permutation network (SPN) paradigm which is at the heart of most real-world implementations of such ciphers.
In this work, we introduce a novel paradigm for justifying the security of existing block ciphers, which we call small-box cryptography. Unlike the "big-box" paradigm, it allows one to go much deeper inside the existing block cipher constructions, by only idealizing a small (and, hence, realistic!) building block of very small size n, such as an 8-to-32-bit S-box. It then introduces a clean and rigorous mixture of proofs and hardness conjectures which allow one to lift traditional, and seemingly meaningless, "at most 2^{-n}" security proofs for reduced-round idealized variants of the existing block ciphers, into meaningful, full-round security justifications of the actual ciphers used in the real world.
We then apply our framework to the analysis of SPN ciphers (e.g., generalizations of AES), getting quite reasonable and plausible concrete hardness estimates for the resulting ciphers. We also apply our framework to the design of stream ciphers. Here, however, we focus on the simplicity of the resulting construction, for which we managed to find a direct "big-box"-style security justification, under the well-studied and widely believed eXact Linear Parity with Noise (XLPN) assumption.
Overall, we hope that our work will initiate many follow-up results in the area of small-box cryptography.
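As a toy illustration of the SPN paradigm that small-box cryptography reasons about, here is a 16-bit round function built from a 4-bit S-box. The choice of the PRESENT cipher's S-box, the specific bit permutation, and all parameters are my own; this is a teaching sketch, not a secure cipher and not a construction from the paper.

```python
# Toy 16-bit SPN round: substitute with a small S-box, permute bits, XOR key.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]   # PRESENT cipher's 4-bit S-box
PERM = [(4 * i) % 15 if i != 15 else 15 for i in range(16)]  # bit permutation

def spn_round(state, round_key):
    # 1) small-box substitution on each 4-bit nibble
    nibbles = [(state >> (4 * i)) & 0xF for i in range(4)]
    state = sum(SBOX[v] << (4 * i) for i, v in enumerate(nibbles))
    # 2) bit permutation for diffusion across nibbles
    state = sum(((state >> i) & 1) << PERM[i] for i in range(16))
    # 3) round-key mixing
    return state ^ round_key

def encrypt(block, round_keys):
    for rk in round_keys:
        block = spn_round(block, rk)
    return block
```

Each layer is a bijection, so the composition is a permutation of the 16-bit block space; the security question small-box cryptography asks is how many such rounds are needed when only the S-box is idealized.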
Space-Time Tradeoffs for Graph Properties
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1998. Includes bibliographical references (p. 83-84). By Yevgeniy Dodis.
Randomness Condensers for Efficiently Samplable, Seed-Dependent Sources
We initiate a study of randomness condensers for sources that are efficiently samplable but may depend on the seed of the condenser. That is, we seek functions Cond : {0,1}^n x {0,1}^d -> {0,1}^m such that if we choose a random seed S <- {0,1}^d, and a source X = A(S) is generated by a randomized circuit A of size t such that X has min-entropy at least k given S, then Cond(X;S) should have min-entropy at least some k' given S. The distinction from the standard notion of randomness condensers is that the source X may be correlated with the seed S (but is restricted to be efficiently samplable). Randomness extractors of this type (corresponding to the special case where k' = m) have been implicitly studied in the past (by Trevisan and Vadhan, FOCS '00). We show that:
- Unlike extractors, we can have randomness condensers for samplable, seed-dependent sources whose computational complexity is smaller than the size t of the adversarial sampling algorithm A. Indeed, we show that sufficiently strong collision-resistant hash functions are seed-dependent condensers that produce outputs with min-entropy k' = m - O(log t), i.e. logarithmic entropy deficiency.
- Randomness condensers suffice for key derivation in many cryptographic applications: when an adversary has negligible success probability (or negligible "squared advantage" [3]) for a uniformly random key, we can use instead a key generated by a condenser whose output has logarithmic entropy deficiency.
- Randomness condensers for seed-dependent samplable sources that are robust to side information generated by the sampling algorithm imply soundness of the Fiat-Shamir Heuristic when applied to any constant-round, public-coin interactive proof system.
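A minimal sketch of the first bullet's shape, with SHA-256 standing in for a sufficiently strong collision-resistant hash (an assumption on my part; the abstract's claim is about CRHFs in general and does not endorse a concrete function):

```python
import hashlib

def condense(x: bytes, seed: bytes, out_bytes: int = 32) -> bytes:
    """Seed-dependent condenser sketch: Cond(X; S) = H(S || X).

    Even if the source X = A(S) is chosen by an adversarial sampler after
    seeing the seed S, collision resistance limits how far the output's
    min-entropy can fall below its length (roughly m - O(log t) for
    samplers of size t, per the result above).
    """
    return hashlib.sha256(seed + x).digest()[:out_bytes]
```

The point of the bullet is that, unlike a seed-dependent *extractor*, this hash can be far cheaper to evaluate than the adversarial sampler it defends against.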
Unilaterally-Authenticated Key Exchange
Key Exchange (KE), which enables two parties (e.g., a client and a server) to securely establish a common private key while communicating over an insecure channel, is one of the most fundamental cryptographic primitives. In this work, we address the setting of unilaterally-authenticated key exchange (UAKE), where an unauthenticated (unkeyed) client establishes a key with an authenticated (keyed) server. This setting is highly motivated by many practical uses of KE on the Internet, but received relatively little attention so far.
Unlike the prior work, which defined UAKE by downgrading a relatively complex definition of mutually authenticated key exchange (MAKE), our definition follows the opposite approach of upgrading existing definitions of public key encryption (PKE) and signatures towards UAKE. As a result, our new definition is short and easy to understand. Nevertheless, we show that it is equivalent to the UAKE definition of Bellare-Rogaway (when downgraded from MAKE), and thus captures a very strong and widely adopted security notion, while looking very similar to the simple "one-oracle" definition of traditional PKE/signature schemes. As a benefit of our intuitive framework, we show two exactly-as-you-expect (i.e., having none of the caveats so abundant in the KE literature!) UAKE protocols from (possibly interactive) signature and encryption. By plugging in various one- or two-round signature and encryption schemes, we derive provably-secure variants of various well-known UAKE protocols (such as a unilateral variant of SKEME with and without perfect forward secrecy, and Shoup's A-DHKE-1), as well as new protocols, such as the first -round UAKE protocol which is both (passively) forward deniable and forward-secure.
To further clarify the intuitive connections between PKE/signatures and UAKE, we define and construct stronger forms of (necessarily interactive) PKE/signature schemes, called confirmed encryption and confidential authentication, which, respectively, allow the sender to obtain confirmation that the (keyed) receiver output the correct message, or to hide the content of the message being authenticated from anybody but the participating (unkeyed) receiver. Using confirmed PKE/confidential authentication, we obtain two concise UAKE protocols of the form: "send confirmed encryption/confidential authentication of a random key."
Fuzzy Extractors: How to Generate Strong Keys from Biometrics and Other Noisy Data
We provide formal definitions and efficient secure techniques for
- turning noisy information into keys usable for any cryptographic
application, and, in particular,
- reliably and securely authenticating biometric data.
Our techniques apply not just to biometric information, but to any keying
material that, unlike traditional cryptographic keys, is (1) not reproducible
precisely and (2) not distributed uniformly. We propose two primitives: a
"fuzzy extractor" reliably extracts nearly uniform randomness R from its input;
the extraction is error-tolerant in the sense that R will be the same even if
the input changes, as long as it remains reasonably close to the original.
Thus, R can be used as a key in a cryptographic application. A "secure sketch"
produces public information about its input w that does not reveal w, and yet
allows exact recovery of w given another value that is close to w. Thus, it can
be used to reliably reproduce error-prone biometric inputs without incurring
the security risk inherent in storing them.
We define the primitives to be both formally secure and versatile,
generalizing much prior work. In addition, we provide nearly optimal
constructions of both primitives for various measures of ``closeness'' of input
data, such as Hamming distance, edit distance, and set difference.
Comment: 47 pp., 3 figures. Prelim. version in Eurocrypt 2004, Springer LNCS
3027, pp. 523-540. Differences from version 3: minor edits for grammar,
clarity, and typos
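A toy secure sketch for Hamming distance can be built from the classic code-offset idea, here instantiated with a 5x repetition code; the code choice and parameters are illustrative assumptions of mine, not the paper's near-optimal constructions.

```python
import secrets

# Code-offset secure sketch for Hamming errors, using a 5x repetition code:
# publish p = w XOR Encode(r) for random r; later, decode w' XOR p back to r
# and re-encode to recover w exactly from any close-enough reading w'.
REP = 5  # each data bit is repeated 5 times; corrects up to 2 flips per group

def encode(bits):
    return [b for b in bits for _ in range(REP)]

def decode(bits):
    # majority vote within each group of REP repeated bits
    return [int(sum(bits[i:i + REP]) * 2 > REP) for i in range(0, len(bits), REP)]

def sketch(w):
    """Public helper data p for input w; p masks w with a random codeword."""
    r = [secrets.randbelow(2) for _ in range(len(w) // REP)]
    return [a ^ b for a, b in zip(w, encode(r))]

def recover(w_noisy, p):
    # w_noisy XOR p is a noisy codeword; decode, re-encode, and unmask.
    r = decode([a ^ b for a, b in zip(w_noisy, p)])
    return [a ^ b for a, b in zip(encode(r), p)]
```

Feeding the recovered w into a randomness extractor then yields the uniform key R, which is the fuzzy-extractor half of the paper's pair of primitives.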
Online Linear Extractors for Independent Sources
In this work, we characterize online linear extractors. In other words, given a matrix A over GF(2), we study the convergence of the iterated process S <- AS + X, where the input X is repeatedly sampled independently from some fixed (but unknown) distribution with (min-)entropy at least k. Here, we think of S as the state of an online extractor, and X as its input.
As our main result, we show that the state S converges to the uniform distribution for all input distributions of entropy k > 0 if and only if the matrix A has no non-trivial invariant subspace (i.e., no non-zero proper subspace V such that AV is contained in V). In other words, a matrix A yields an online linear extractor if and only if A has no non-trivial invariant subspace. For example, the linear transformation corresponding to multiplication by a generator of the field GF(2^n) yields a good online linear extractor. Furthermore, for any such matrix, convergence takes polynomially many steps.
We also study the more general notion of condensing---that is, we ask when this process converges to a distribution with entropy at least k', when the input distribution has entropy greater than k. (Extractors correspond to the special case when k' = n.) We show that a matrix A gives a good condenser if there are relatively few vectors w such that the iterates w, A^T w, (A^T)^2 w, ... are linearly dependent. As an application, we show that the very simple cyclic rotation transformation condenses to n - 1 bits for any k > 1 if n is a prime satisfying a certain simple number-theoretic condition.
Our proofs are Fourier-analytic and rely on a novel lemma, which gives a tight bound on the product of certain Fourier coefficients of any entropic distribution.
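The invariant-subspace criterion can be tested mechanically for small matrices. The sketch below (my own illustration, not code from the paper) uses the equivalent condition that the Krylov span of every non-zero vector v, i.e. the GF(2)-span of {v, Av, A^2 v, ...}, must be the whole space:

```python
def mat_vec(A, v):
    # matrix-vector product over GF(2)
    n = len(A)
    return tuple(sum(A[i][j] & v[j] for j in range(n)) % 2 for i in range(n))

def is_online_extractor(A):
    """True iff A has no non-trivial invariant subspace over GF(2):
    every non-zero v must have a full Krylov span."""
    n = len(A)
    for x in range(1, 1 << n):
        v = tuple((x >> j) & 1 for j in range(n))
        # orbit of v under A
        orbit = {v}
        frontier = [v]
        while frontier:
            w = mat_vec(A, frontier.pop())
            if w not in orbit:
                orbit.add(w)
                frontier.append(w)
        # GF(2)-span of the orbit
        span = {tuple(0 for _ in range(n))}
        for u in orbit:
            span |= {tuple(a ^ b for a, b in zip(u, s)) for s in span}
        if len(span) < (1 << n):
            return False  # the span is a proper invariant subspace
    return True

# Companion matrix of x^4 + x + 1, irreducible over GF(2): multiplication by a
# generator of GF(2^4), so no non-trivial invariant subspace.
GOOD = [[0, 0, 0, 1],
        [1, 0, 0, 1],
        [0, 1, 0, 0],
        [0, 0, 1, 0]]
# The identity fixes every subspace, so it is not an online extractor.
BAD = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
```

As the abstract notes, multiplication by a field generator (the GOOD matrix above) passes the test, while any matrix with an invariant subspace, such as the identity or a cyclic rotation, fails.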
Immunizing Backdoored PRGs
A backdoored Pseudorandom Generator (PRG) is a PRG which looks pseudorandom to the outside world, but a saboteur can break PRG security by planting a backdoor into a seemingly honest choice of public parameters for the system. Backdoored PRGs became increasingly important due to revelations about NIST's backdoored Dual EC PRG, and later results about its practical exploitability.
Motivated by this, at Eurocrypt'15 Dodis et al. [21] initiated the question of immunizing backdoored PRGs. A k-immunization scheme repeatedly applies a post-processing function to the outputs of k backdoored PRGs, to render any (unknown) backdoors provably useless. For k = 1, [21] showed that no deterministic immunization is possible, but then constructed seeded 1-immunizers either in the random oracle model, or under strong non-falsifiable assumptions. As our first result, we show that no seeded 1-immunization scheme can be black-box reduced to any efficiently falsifiable assumption.
This motivates studying k-immunizers for k >= 2, which have the additional advantage of being deterministic (i.e., seedless). Indeed, prior work at CCS'17 [37] and CRYPTO'18 [7] gave supporting evidence that simple k-immunizers might exist, albeit in slightly different settings. Unfortunately, we show that the simple standard-model proposals of [37, 7] (including the XOR function [7]) provably do not work in our setting. On the positive side, we confirm the intuition of [37] that a (seedless) random oracle is a provably secure 2-immunizer. On the negative side, no (seedless) 2-immunization scheme can be black-box reduced to any efficiently falsifiable assumption, at least for a large class of natural 2-immunizers which includes all cryptographic hash functions.
In summary, our results show that k-immunizers occupy a peculiar place in the cryptographic world. While they likely exist, and can be made practical and efficient, it is unlikely that one can reduce their security to a clean standard-model assumption.
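For concreteness, the positive result suggests the following shape for a seedless 2-immunizer, with SHA-256 standing in for the random oracle; this is my reading of the construction, and the function name and block convention are assumptions, not the paper's specification.

```python
import hashlib

def immunize2(block1: bytes, block2: bytes) -> bytes:
    """Seedless 2-immunizer sketch: hash together one output block from each
    of two independently-seeded (possibly backdoored) PRGs. SHA-256 stands in
    for the random oracle of the positive result; note that plain XOR of the
    two blocks is among the simple proposals shown NOT to work here."""
    return hashlib.sha256(block1 + block2).digest()
```

The negative result above cautions that no standard-model hash provably suffices under falsifiable assumptions, so this sketch is only meaningful in an idealized (random-oracle) analysis.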